Web Survey Bibliography
In Web surveys, rating scales measuring respondents’ attitudes and self-descriptions by means of a series of related statements are commonly presented in grid (or matrix) questions. Despite the benefits of displaying multiple rating scale items neatly arranged and supposedly easy to complete on a single screen, respondents are often tempted to rely on cognitive shortcuts to reduce the cognitive and navigational effort required to answer a set of rating scale items. To minimize the risk that such shortcuts result in satisficing rather than optimal answers, respondents have to be motivated to spend extra time and effort on the attentive and careful processing of rating scales. Interactive Web surveys offer a wide range of visual and dynamic features that allow for visual enhancement and greater interactivity in the presentation of survey questions. To date, however, only a few studies have systematically examined new rating scale designs using data input methods other than conventional radio buttons. In the present study, two different rating scales were designed using drag-and-drop as a more interactive data input method: respondents have to drag the response options towards the rating scale items (‘drag-response’) or, in the reverse direction, the rating scale items towards the response options (‘drag-item’). In both drag-and-drop rating scales, the visual highlighting of the items and response options, as well as the dynamic strengthening of the link between these key components, is aimed at encouraging respondents to process a rating scale more attentively and carefully.
The effectiveness of the drag-and-drop rating scales in reducing respondents’ susceptibility to cognitive shortcuts is assessed on the basis of five systematic response tendencies typically associated with rating scales: careless, nondifferentiated, acquiescent, and extreme responding, as well as respondents’ systematic tendency to select one of the first response options, so-called primacy effects. Moreover, item missing data, response times, and respondent evaluations are examined. The findings of the present study revealed that although both drag-and-drop scales entail a higher level of respondent burden, as indicated by an increase in item missing data and longer response times compared to conventional radio button scales, they promote respondents’ attentiveness and carefulness towards the response task, which is accompanied by a reduced susceptibility to cognitive shortcuts in processing rating scales.
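As a rough illustration of how some of the response tendencies named above can be operationalized, the following sketch computes simple per-respondent indicators. This is a hypothetical example, not the paper's own measurement approach: it assumes a 5-point agree–disagree scale coded 1 to 5, treats low within-respondent variance as a sign of nondifferentiation (straightlining), the share of agreeing answers as a proxy for acquiescence, and the share of endpoint answers as a proxy for extreme responding.

```python
# Hypothetical sketch (not taken from the study): crude indicators for three
# of the response tendencies discussed above, assuming answers to a grid of
# rating-scale items coded 1 ("strongly disagree") to 5 ("strongly agree").
from statistics import pstdev

def response_tendency_indicators(ratings, scale_max=5, agree_codes=(4, 5)):
    """Return rough straightlining, acquiescence, and extremity indicators
    for one respondent's answers to a set of rating-scale items."""
    n = len(ratings)
    return {
        # Nondifferentiation: little spread across items suggests straightlining.
        "nondifferentiation_sd": pstdev(ratings) if n > 1 else 0.0,
        # Acquiescence: share of agreeing answers regardless of item content.
        "acquiescence": sum(r in agree_codes for r in ratings) / n,
        # Extremity: share of answers at either scale endpoint.
        "extremity": sum(r in (1, scale_max) for r in ratings) / n,
    }

# A respondent who answers "agree" to every item in the grid:
print(response_tendency_indicators([4, 4, 4, 4, 4, 4]))
```

Real analyses of these tendencies are more involved (e.g., careless responding and primacy effects require item- and position-level information), but the sketch shows the kind of per-respondent summary such indicators reduce to.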
In online surveys, rating scales measuring respondents’ attitudes and personality traits are preferably presented in the form of a matrix question. Matrix questions do offer certain advantages in terms of a clear layout and the supposedly easy completion of several items at once. At the same time, however, they are more susceptible to systematic response tendencies that can reduce data quality. To counter the risk of such shortcut strategies, respondents must be motivated to process rating scales attentively and carefully. Online surveys allow the use of visual and interactive elements to enhance the appearance of individual questions and to increase the interactivity of the survey process as a whole. To date, however, only a few studies have examined the use of such design elements in rating scales. Against this background, two different drag-and-drop rating scales were designed in the present study: in the drag-response scale, respondents are asked to drag a selected response option to the respective item with the mouse pointer, whereas in the drag-item scale the respective item is dragged to the selected response option. The drag-and-drop technique is intended to direct attention specifically to the items and response options and to strengthen the link between each item and the selected response option. To assess the effectiveness of the two drag-and-drop rating scales in promoting more attentive and careful processing and, ultimately, in preventing systematic response tendencies, several indicators of data quality are examined, including careless responding, nondifferentiation, acquiescence, extremity, and primacy effects.
In addition, the extent of item missing data, response times, and respondent evaluations are analyzed. The results of the present study show that while the drag-and-drop rating scales do entail greater cognitive and navigational effort, leading to more missing values and longer response times, they also motivate respondents to answer more attentively and carefully, which in turn counteracts systematic response tendencies.
Web survey bibliography (4086)
- App vs. Web for Surveys of Smartphone Users: Experimenting with mobile apps for signal-contingent experience...; 2015; McGeeney, K.; Keeter, S.; Igielnik, R.; Smith, A.; Rainie, L.
- Using Video to Reinvigorate the Open Question; 2015; Cape, P.
- On the Go: How Mobile Participants Affect Survey Results; 2015; Barlas, F. M.; Thomas, R. K.
- The Matrix Lives On: Improving Grids for Online Surveys; 2015; Thomas, R. K.; Barlas, F. M.; Graham, P.; Subias, T.
- Variance Estimation for Surveys from Internet Panels; 2015; Rivers, D.
- Sensitivity Analysis of Bias of Estimates from Web Surveys with Nonrandomized Panel Selection; 2015; Beresovsky, V.
- Detecting Fraud in a Survey Sample Recruited Online; 2015; Brown, D.; Dever, J. A.; Augustson, E.; Squiers, L.
- Survey Treatments and Response Modes: Bayesian Survival Analysis with Competing Risks; 2015; Minato, H.
- Purposefully Mobile: Experimentally Assessing Device Effects in an Online Survey; 2015; Barlas, F. M.; Thomas, R. K.; Graham, P.
- Use of Smartphones as a New Survey Mode: A Feasibility Study; 2015; Hu, S.; Freedner-Maguire, N.; Dayton, J.; Neff, L.
- Using equivalence testing to disentangle selection and measurement in mixed modes surveys; 2015; Cernat, A.
- What do web survey panel respondents answer when asked “Do you have any other comment?”; 2015; Schonlau, M.
- On Climbing Stairs Many Steps at a Time: The New Normal in Survey Methodology; 2015; Dillman, D. A.
- Mobile Research Methods: Opportunities and challenges of mobile research methodologies; 2015; Toninelli, D. (Ed.); Pinter, R.; de Pedraza, P.
- Effect of Web-Based Versus Paper-Based Questionnaires and Follow-Up Strategies on Participation Rates...; 2015; Kilsdonk, E.; van den Heuvel-Eibrink, M. M.; van Dulmen-den Broeder, E.; van der Pal, H. J. H.; van...
- Polling Error in the 2015 UK General Election: An Analysis of YouGov’s Pre and Post-Election Polls...; 2015; Wells, A.; Rivers, D.
- Cell Phone and Face-to-face Interview Responses in Population-based Surveys - How Do They Compare?; 2015; Ghandour, L.; Ghandour, B.; Mahfoud, Z.; Mokdad, A.; Sibai, A. M.
- Collecting Health Research Data - Comparing Mobile Phone-assisted Personal Interviewing to Paper-and...; 2015; van Heerden, A. C.; Norris, S. A.; Tollman, S. M.; Richter, L. M.
- The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability...; 2015; Bosnjak, M.; Struminskaya, B.; Weyandt, K.
- Are Sliders Too Slick for Surveys? An Experiment Comparing Slider and Radio Button Scales for Smartphone...; 2015; Aadland, D.; Aalberg, T.
- Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study; 2015; Arn, B.; Klug, S.; Kolodziejski, J.
- Maximizing Data Quality using Mode Switching in Mixed-Device Survey Design: Nonresponse Bias and Models...; 2015; Axinn, W.; Gatny, H. H.; Wagner, J.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Usability of the ACS Internet Instrument on Mobile Devices; 2015; Horwitz, R.
- Explorations in Non - Probability Sampling Using the Web; 2015; Brick, J. M.
- On Bias Adjustments for Web Surveys; 2015; Fan, L.; Lou, W.; Landsman, V.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- Web panel surveys – a challenge for official statistics; 2015; Svensson, J.
- Estimation with Non-probability Surveys and the Question of External Validity; 2015; Dever, J. A.; Valliant, R. L.
- Measurement Properties of Web Surveys; 2015; Tourangeau, R.
- Measuring Political Knowledge in Web-Based Surveys: An Experimental Validation of Visual Versus Verbal...; 2015; Munzert, S.; Selb, P.
- Validation of the new scale for measuring behaviors of Facebook users: Psycho-Social Aspects of Facebook...; 2015; Bodroza, B.; Jovanovic, T.
- Adding Postal Follow-Up to a Web-Based Survey of Primary Care and Gastroenterology Clinic Physician...; 2015; Partin, M. R.; Powell, A. A.; Burgess, D. J.; Haggstrom, D. A.; Gravely, A. A.; Halek, K.; Bangerter...
- Can Non-full-probability Internet Surveys Yield Useful Data? A Comparison with Full-probability Face...; 2015; Simmons, A.D.; Bobo, L. D.
- Participation rates, response bias and response behaviours in the community survey of the Swiss Spinal...; 2015; Fekete, C.; Segerer, W.; Gemperli, A.; Brinkhof, M.W.G.
- The Cathie Marsh lecture: What does the failure of the polls tell us about the future of survey research...; 2015; Sturgis, P.; Matheson, J.
- GreenBook Research Industry Trends Report; 2015; Murphy, L. (Ed.)
- Designing web surveys for the multi-device internet; 2015; de Bruijne, M.
- Data Quality Standards in Mixed Mode Surveys; 2015; Bremer, J.; Barbulescu, M.; Bennett, J.
- Mobile Surveys for Kids: Making Surveys G-rated; 2015; Simpson, B.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- A Review of Issues in Gamified Surveys; 2015; Keusch, F.; Zhang, Che.
- Mobility Enabled: Effects of Mobile Devices on Survey Response and Substantive Measures; 2015; Barlas, F. M.; Randall, T. K.
- Innovations in Email Invitation Design for Today’s Digital World; 2015; Saunders, T.; Kessler, A.
- Gamification in Survey Research: Do The Results Support The Evangelists?; 2015; Pashupati, K.; Weber-Raley, L.
- What They Can’t See Can Hurt You: Improving Grids for Mobile Devices; 2015; Randall, T. K.; Barlas, F. M.; Graham, P.; Subias, T.
- Same, Same but Different: Effects of mixing Web and mail modes in audience research; 2015; Bergstroem, A.
- Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities; 2015; Lee, C.-K.; Back, K.-J.; Williams, Ro. J.; Ahn, S.-S.
- An Empirical Test of Nonresponse Bias in Internet Surveys; 2015; af Wahlberg, A. E.; Poom, L.